1990s
Harnessing genes, recasting flesh

Introduction
In the 1990s, the Human Genome Project took off like a rocket, along with many global genomic initiatives aimed at plants, animals, and microorganisms. Gene therapy developed, as did the potential of human cloning and the use of human tissues as medicine. The new technologies provided hope for solving the failures of the old through a new paradigm—one that was more complex, holistic, and individualized. The changing view became one of medicines and therapies based on knowledge, not trial and error; on human flesh, not nature’s pharmacy.

The new therapeutic paradigm evolved from an earlier promise of hormone drugs and the flowering of intelligent drug design. It grew from an understanding of receptors and from breakthroughs in genomics and genetic engineering. It found favor in the new power of combinatorial chemistry and the modeling and data management abilities of the fastest computers. Hope was renewed for a next generation of magic bullets born of human genes. As the 1990s ended, many genetically based drugs were in clinical trials and a wealth of genome sequences, their promise as yet unknown, led scientists to reason that the 21st century would be the biotech century.

Computers and combinatorial technology
Robotics and automation allowed researchers to finally break through a host of constraints on rational drug design. Achievements in miniaturization in robotics and computer systems allowed the manipulation of many thousands of samples and reactions in the time and space previously required for only a few. They permitted the final transformation of pharmacology from a tedious, hit-and-miss science based primarily on organic synthesis to one based firmly on physiology and complex biochemistry, allowing explosive movement into rational drug discovery in both laboratory design and natural-product surveys. (And even when the technology remained hit-and-miss because of the lack of a compatible knowledge base, the sheer quantity of samples and reactions now testable created efficiencies of scale that made the random nature of the process extraordinarily worthwhile.)

Combinatorial chemists produce libraries of chemicals based on the controlled and sequential modification of generally immobilized or otherwise tagged chemical building blocks. These original moieties are chosen, when knowledge permits, for their predicted or possible behavior in a functional drug, protein, polymer, or pesticide. Developing the knowledge base for starting materials proved to be one of the greatest benefits of computer modeling of receptors and the development of computational libraries (multiple structures derived from computer algorithms that analyze and predict potentially useful sequences from databases of gene or protein sequences, or from structural information about previous drugs). Here the search for natural products remained critical—for the discovery of new starting places.
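
The arithmetic behind such libraries is worth seeing: the library size is the product of the number of choices at each variable position. The following minimal Python sketch, using invented scaffold and substituent names rather than real reagents, shows how quickly candidates multiply:

```python
# Minimal sketch of combinatorial enumeration: every scaffold is paired
# with every choice of substituent at each variable position. The
# building-block names are hypothetical placeholders, not real reagents.
from itertools import product

scaffolds = ["benzodiazepine", "hydantoin"]      # core structures
r1_groups = ["H", "CH3", "OCH3", "Cl"]           # substituents at position R1
r2_groups = ["phenyl", "cyclohexyl", "pyridyl"]  # substituents at position R2

library = [
    f"{core}(R1={r1}, R2={r2})"
    for core, r1, r2 in product(scaffolds, r1_groups, r2_groups)
]

print(len(library))   # 2 x 4 x 3 = 24 distinct candidates
print(library[:3])
```

Two scaffolds with four and three substituent choices already yield 24 candidates; a third variable position with ten options would push the same scheme to 240, which is why robotic synthesis and tagging became indispensable.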

Finding new drug starting places, as much as finding drugs that were useful “as is,” became important in the 1990s as companies tried to tap into the knowledge base of traditional medical practitioners in the developing world through collaboration rather than appropriation. In this manner, several companies promoted the analysis of the biodiversity available in tropical rainforests and oceans. This “added value” of biodiversity became both the rallying cry for environmentalists and a point of solidarity for political muscle-building in the developing world. As the industrialized world’s demand for new drugs (and associated profits) increased, developing nations sought to prevent the perceived exploitation of their heritage.

And just as previous sources of drugs from the natural world were not wholly ignored (even as the human genome became the “grail” of modern medical hopes), combinatorial approaches created a new demand for the services of traditional organic and analytical chemistry. Although computer models were beneficial, new wet chemistry techniques still had to be defined on the basis of the new discoveries in genomics and proteomics. They had to be modified for microsystems and mass production—for the triumph of the microtiter plate over the flask, the test tube, and the beaker.

Drugs were still chemical products after all.

High-throughput screening
The vast increase in the number of potential drugs produced through combinatorial methods created a new bottleneck in the system—screening and evaluating these libraries, which held hundreds of thousands of candidates. To conduct high-throughput screening (HTS) on this embarrassment of riches, new and more automated assays were pressed into service. Part of the move into HTS was due to the burgeoning of useful bioassays, which permitted screening with individual cells and cell components, tissues, engineered receptor proteins, nucleic acid sequences, and immunologicals. New forms of assays proliferated, and so did opportunities for evaluating them quickly through new technologies.
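
The triage an HTS workflow performs can be suggested with a toy calculation (not any particular vendor's system): simulate one plate of assay signals, estimate the noise from negative controls, and keep only the wells that rise well above it. All numbers below are invented.

```python
# Toy HTS triage: flag wells whose signal exceeds the negative-control
# mean by three standard deviations. One "active" well is planted so
# the cutoff has something to find; all values are invented.
import random

random.seed(1)
controls = [random.gauss(100, 10) for _ in range(16)]    # negative controls
samples = {f"well_{i:03d}": random.gauss(100, 10) for i in range(368)}
samples["well_042"] += 80                                # planted active

mu = sum(controls) / len(controls)
sd = (sum((x - mu) ** 2 for x in controls) / len(controls)) ** 0.5
threshold = mu + 3 * sd

hits = [well for well, signal in samples.items() if signal > threshold]
print(hits)   # the planted well stands out; occasional noise may join it
```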

Researchers continued to move away from radioactivity in bioassays, automated sequencing, and synthesis by using various new tagging molecules to directly monitor cells, substrates, and reaction products by fluorescence or phosphorescence. Fluorescence made it possible to examine, for the first time, the behavior of single molecules or molecular species in in vivo systems. Roger Tsien and colleagues, for example, constructed variants of the standard green fluorescent protein for use in a calmodulin-based chimera to create a genetically encoded fluorescent indicator of Ca2+. They called this marker “yellow cameleon” and used it in transgenic Caenorhabditis elegans to follow calcium metabolism during muscle contraction in living organisms.

Of particular promise was the use of such “light” technologies with DNA microarrays, which allowed quantitative analysis and comparison of gene expression by multicolor spectral imaging. Many genes are differentially expressed, especially in cancerous versus normal cells, and microarray techniques showed promise for discovering those genes. Microarrays thus became a basic research tool and a highly promising candidate for HTS in drug development.
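
The underlying comparison in a two-color microarray experiment can be sketched in a few lines: each spot's red (say, tumor) and green (say, normal) intensities reduce to a log ratio, and genes beyond a twofold change are flagged. The gene names and intensities below are invented.

```python
# Sketch of two-color microarray analysis: compare Cy5 (e.g., tumor)
# and Cy3 (e.g., normal) intensities per spot and report genes whose
# expression changes at least twofold. All values are invented.
import math

spots = {            # gene: (Cy5 signal, Cy3 signal)
    "geneA": (5200.0, 1300.0),
    "geneB": (850.0, 900.0),
    "geneC": (300.0, 2400.0),
}

for gene, (cy5, cy3) in spots.items():
    log_ratio = math.log2(cy5 / cy3)   # >0 means up in tumor, <0 down
    if abs(log_ratio) >= 1.0:          # at least a twofold change
        print(f"{gene}: log2 ratio = {log_ratio:+.2f}")
```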

Genomics meets proteomics
Knowledge of the details of the genetic code, first learned during the 1960s, achieved practical application during the 1990s on a previously unimaginable scale. Moving from gene to protein and back again provided an explosion of information as the human (and other) genome projects racked up spectacular successes. Planned at the end of the 1980s, the U.S. Human Genome Project and the international Human Genome Organization led the way. Begun as a collection of government and university collaborations, the search for the human genome was rapidly adopted by segments of industry. The issue of patenting human, plant, and animal genes would remain a persistent controversy.

Inspired by this new obsession with genomics, the 1990s may ultimately be best known for the production of the first complete genetic maps. The first full microorganism genome was sequenced in 1995 (Haemophilus influenzae, by Craig Venter and colleagues at The Institute for Genomic Research). This achievement was followed rapidly by the genome sequencing of Saccharomyces cerevisiae (baker’s yeast) in 1996; Escherichia coli, Borrelia burgdorferi, and Helicobacter pylori in 1997; the nematode C. elegans in 1998; and the first sequenced human chromosome (22) in 1999. The entrance of industry into the race to sequence the human genome at the tail end of the decade sped up the worldwide effort, albeit amid controversy.
BIOGRAPHY: The new human
Perhaps the most significant and controversial “individual” in the 1990s was the new human—a creation of gene sequences, manipulable, copyable, and (to many) patentable in all its parts. She was the “human” of the Human Genome Project. The genetic determinism enthroned in this new persona became a powerful paradigm, from the discovery of the latest disease gene of the month, to the explosion of DNA fingerprinting and diagnostics, to a renewed belief in genes for homosexuality, violence, intelligence, and all manner of psychological states. The new human was a research tool transformed into social, political, economic, and philosophical dogma, the controversial ramifications of which will inevitably play out in the 21st century. In 1995, the Visible Human Project provided an apt symbol for this new paradigm—the human body, male and female, sliced and electromagnetically diced and viewable on a host of Web sites from every angle. To some, it was the ultimate sacrilege; to others, the flowering of the new technology.

Hot on the heels of the genomics “blastoff” came the development of proteomics—the science of analyzing, predicting, and using the proteins produced from genes, together with the cellular processing these macromolecules undergo before they achieve full functionality in cells.

Both proteomics and genomics rely on bioinformatics to be useful. Bioinformatics is essentially the computerized storage and analysis of biological data, from standard gene sequence databases (such as the online repository GenBank maintained by the NIH) to complex pattern-recognition systems such as GRAIL (developed in 1991 by Edward Uberbacher of Oak Ridge National Laboratory). GRAIL and more than a dozen other programs were used to find prospective genes in genomic databases such as GenBank by employing various pattern recognition techniques.
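
GRAIL itself relied on trained statistical and neural-network models, but the flavor of sequence-based gene finding can be conveyed by a far simpler stand-in: scanning a DNA string for open reading frames on the forward strand. This is an illustrative toy, not GRAIL's algorithm.

```python
# Deliberately simple stand-in for gene-finding pattern recognition:
# find open reading frames (ATG ... stop) in the three forward frames.
STOPS = {"TAA", "TAG", "TGA"}

def forward_orfs(dna, min_codons=10):
    dna = dna.upper()
    orfs = []
    for frame in range(3):
        start = None
        for i in range(frame, len(dna) - 2, 3):
            codon = dna[i:i + 3]
            if codon == "ATG" and start is None:
                start = i                      # open a candidate ORF
            elif codon in STOPS and start is not None:
                if (i - start) // 3 >= min_codons:
                    orfs.append((start, i + 3))
                start = None
    return orfs

# Any A/C/G/T string will do; real input came from GenBank entries.
print(forward_orfs("ATG" + "GCT" * 12 + "TAA", min_codons=5))   # [(0, 42)]
```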

Pattern recognition techniques were also at the heart of the new DNA microarrays discussed above, and they were increasingly used to detect comparative patterns of gene transcription in cells under various conditions and states (including diseased vs healthy).
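
A minimal version of that kind of pattern detection, with invented expression profiles, is to group genes whose measurements across conditions track each other closely:

```python
# Bare-bones stand-in for expression-profile clustering: report gene
# pairs whose profiles across four conditions correlate strongly.
# All profiles are invented for illustration.
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

profiles = {
    "geneA": [1.0, 2.1, 3.9, 8.0],
    "geneB": [0.9, 2.0, 4.2, 7.7],   # tracks geneA
    "geneC": [5.0, 4.1, 2.2, 0.3],   # moves in the opposite direction
}

names = list(profiles)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        r = pearson(profiles[names[i]], profiles[names[j]])
        if r > 0.9:
            print(names[i], "and", names[j], "co-expressed, r =", round(r, 3))
```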

Human biotechnology
In September 1990, the first human gene therapy was started by W. French Anderson at NIH in an attempt to cure adenosine deaminase (ADA) deficiency—referred to as bubble-boy syndrome—by inserting the correct gene for ADA into an afflicted four-year-old girl. Although the treatment did not provide a complete cure, it did allow the young patient to live a more normal life with supplemental ADA injections.

Other attempts at gene therapy also remained more promising than successful. Clinical trials on humans were disappointing compared with the phenomenal successes in mice, although limited tumor suppression did occur in some cancers, and there were promising reports on the treatment of hemophilia. Jesse Gelsinger, a teenager who suffered from the life-threatening liver disorder ornithine transcarbamylase deficiency, volunteered for adenovirus-delivered gene therapy at a University of Pennsylvania clinical trial in 1999. His subsequent death sent a shock wave through the entire research community, exposed apparent flaws in regulatory protocols and compliance, and increased public distrust of one more aspect of harnessing genes.

Using gene products as drugs, however, was a different story. From recombinant human insulin sold in the 1980s to the humanized antibodies of the 1990s, the successes of harnessing the human genome—whether “sensibly” (in the case of the gene product) or by using antisense techniques as inhibitors of human genes (in 1998, fomivirsen, used to treat cytomegalovirus retinitis, became the first approved antisense therapeutic)—proved tempting to the research laboratories of most major pharmaceutical companies. Many biotechnology medicines—from erythropoietin, tumor necrosis factor, dismutases, growth hormones, and interferons to interleukins and humanized monoclonal antibodies—entered clinical trials throughout the decade.

Beginning in the 1990s, stem cell therapy held great promise. This treatment uses human cells to repair and ameliorate inborn or acquired medical conditions, from Parkinson’s disease and diabetes to traumatic spinal paralysis. By 1998, embryonic stem cells could be grown in vitro, which promised a wealth of new opportunities for this precious (and controversial) resource.

Promising too were the new forms of tissue engineering for therapeutic purposes. Great strides were made in tissue, organ, and bone replacements. The demand for transplants, growing at 15% per year by the end of the decade, led to the search for appropriate artificial or animal substitutes. Cartilage repair systems, such as Carticel by Genzyme Tissue Repair, became commonplace. In the Carticel process, patients’ cells were shipped to the company, cultured, and subsequently reimplanted. Second-generation products permitted autologous cells to be cultured on membranes, allowing tissue formation in vitro. Several companies focused on developing orthobiologics—proteins, such as growth factors, that stimulate the patient’s ability to regenerate tissues.
SOCIETY: Fighting the harness
Dolly, the clonal sheep created from the cells of an adult ewe in 1996, was a bit of agricultural biotechnology that staggered the world—and in so doing provided some of the best evidence of the powerful fears raised by the new genetics and the concept of the new human. Governments and the public reacted with horror, seeing what might be a boon to pharmacology as the ultimate bane to religion and philosophy. Laws were passed protecting “human individuality” and human tissues, fueled as much by antiabortion sentiments as by fears of the new technology.

By the end of the decade, some of the most vociferous protests against harnessing genetics were because of perceived risks of the use and consumption of genetically modified (GM) foods. GM products were claimed to threaten the food chain by denaturalizing and disturbing the environment. Companies such as Monsanto faced worldwide opposition to the development and deployment of transgenic crops. At times, such protests even peripherally damaged the commercial development of transgenics for medicinal purposes. Transgenic plants and animals had literally become a growing source of vaccines, human gene products, and nutritional supplements. This development was part of the movement toward “nutraceuticals”—food crops that contain added value in the form of medicine. But despite the bad press, by the end of the century—in the United States at least—most of the soybeans and at least a third of the corn planted were genetically engineered with either herbicide or insect resistance. The medical uses of transgenic plants and animal products continued to expand, a phenomenon especially driven by the benefits of the new cloning technology.

Epicel, a graft made from autologous cells, was also developed by Genzyme Tissue Repair to replace the skin of burn victims with greater than 50% skin damage. In 1998, Organogenesis introduced Apligraf, the first FDA-approved, ready-to-order human skin replacement, made of living human epidermal keratinocytes and dermal fibroblasts. Also undergoing research in 1999 was Vitrix soft tissue replacement, made of fibroblasts and collagen. By the end of the decade, artificial liver systems (which work outside the body) were developed as temporary blood cleansers providing detoxification and various digestion-related processes. In many cases, such treatments allowed the patient’s own liver to regenerate during the metabolic “rest.”

Such uses of cells and tissues raised numerous ethical questions, which were galvanized in the media by the 1996 arrival of the clonal sheep Dolly. Reaction against the power of the new biotechnology was not restricted to fears of dehumanizing humanity through “xeroxing.” The possibility of routine xenotransplantation (using animal organs as replacements in humans) came to the fore with advances in immunology and genetic engineering that promised the ability to humanize animal tissues (specifically, those of pigs) in ways similar to the development of humanized antibodies in mice.

The issue of xenotransplantation not only raised fears of new diseases creeping into the human population from animal donors, but was seen by some as a further degradation of human dignity either intrinsically or through the misuse of animals. The animal rights lobby throughout the decade argued passionately against the use of animals for human health purposes.

The Red Queen’s race
Continuing the problems seen in the 1980s, old and new diseases were increasingly resistant to the array of weapons devised against them. Like Alice in Through the Looking-Glass, drug researchers had to run as fast as they could just to stay in place in the race against bacterial resistance to traditional antibiotics (see Chapter 7). As the decade progressed, more diseases became untreatable with the standard suite of drugs. Various streptococcal infections, strains of tuberculosis bacteria, pathogenic E. coli, gonorrhea, and the so-called flesh-eating bacteria—necrotizing fasciitis, most commonly caused by group A streptococcus—all became resistant to previously successful magic bullets. Patients died who earlier would have lived.

As the problem worsened, pharmaceutical, software, and instrument companies alike turned to combinatorial chemistry and HTS technologies in an effort to regain the racing edge.

AIDS remained a profoundly disturbing example of the failure of technology to master a disease, despite the incredible advances in understanding its biology that took place in the 1990s. Vaccine efforts occupied much of the popular press, and genetically engineered viral fragments seemed to be the best hope. But the proliferation of viral strains erased hope for a single, easy form of vaccination. Resistance to AZT therapy increased, and even the new protease inhibitors and so-called drug cocktails developed in the 1990s proved to be only stopgap measures as viral strains appeared that were resistant to everything thrown at them.

Individual lives were prolonged, and the death rate in Western countries, where the expensive new drugs were available, dropped precipitously. But the ultimate solution to AIDS had not been found, nor even a countermeasure to its spread in the developing world and among poor populations of industrialized nations.

The middle of the decade saw a resurgence of the “old” plague (bubonic) in India, and even polio remained a problem in the developing world. In Africa, Ebola resurfaced in scattered outbreaks—although it was nothing compared with the continental devastation caused by AIDS. In the rest of the world, fears of biological warfare raised by the Gulf War continued. Vaccination efforts were stepped up for many diseases. In 1990, the World Health Assembly set a global goal of a 95% reduction in measles deaths by 1995 compared with pre-immunization levels. By the deadline, estimated global coverage for measles vaccine had reached 78%, even as the industrialized world experienced a backlash against vaccination because of concerns about adverse side effects.

Vaccine technology continued to improve with the development of recombinant vaccines for several diseases, new efforts to produce vaccines for old scourges such as malaria, and new nasal delivery systems that stimulated the mucosal-associated antibody system.

DNA vaccines—the injection of engineered plasmids into human cells to stimulate antigen production and immunization—were first described in a 1989 patent and published in 1990 by Wolff, Malone, Felgner, and colleagues. They entered clinical trials in 1995. Although one editor of Bio/Technology called this the Third Vaccine Revolution, by the end of the decade the reality of this expansive claim remained in doubt, especially because of the continuing debate over genetic engineering.

Efforts to develop food-based vaccines through the production of transgenics engineered with the appropriate antigens continued. This research stimulated studies on mucosal immunity and efforts to enable complex proteins to cross the gut–blood barrier.

Even with these technological developments, the decade ended under the shadow of ever-expanding disease problems, exacerbated by predictions that global warming would lead to new epidemics of insect-borne and tropical diseases. Still, a note of optimism remained that rational design and automated production technologies would ultimately be able to defeat these diseases.

High tech and new mech
To improve the front end of rational drug design and to foster the use and growth of knowledge bases in genomics and proteomics, many old and new technologies were adapted to pharmaceutical use in the 1990s.

In the 1990s, the use of mass spectrometry (MS) for bioanalysis underwent a renaissance, with improvements such as ultrahigh-performance MS using Fourier-transform ion cyclotron resonance (FT-ICR MS) and tandem-in-time (multidimensional) MS for biological macromolecules. Improved techniques such as peak parking (reducing the column flow rate into the mass spectrometer the instant a peak is detected, so that samples can be split and analyzed by multiple MS experiments nearly simultaneously) added several dimensions that were previously impossible. These changes improved the ability to analyze the complex mixtures required in studies of cellular metabolism and gene regulation. Results from multidimensional runs were analyzed by increasingly sophisticated bioinformatics programs and fed back to improve the underlying knowledge bases. In combination with HPLC and various capillary electrophoretic systems, MS became part of a paradigm for pharmaceutical R&D as a valuable new approach for identifying drug targets and protein function.
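
One way such MS data fed target and protein identification was peptide-mass fingerprinting: digesting candidate proteins in silico and matching the predicted peptide masses against the observed peak list. The sketch below uses invented sequences, an invented peak list, and approximate average residue masses.

```python
# Hedged sketch of peptide-mass fingerprinting: cut each candidate
# sequence at trypsin sites (after K or R), compute approximate
# peptide masses, and count matches to observed peaks. Sequences and
# peaks are invented; masses are average values in daltons.
RESIDUE_MASS = {
    "G": 57.05, "A": 71.08, "S": 87.08, "P": 97.12, "V": 99.13,
    "T": 101.10, "L": 113.16, "K": 128.17, "R": 156.19, "F": 147.18,
}
WATER = 18.02

def tryptic_peptides(seq):
    """Cut after every K or R (ignoring proline rules for simplicity)."""
    peptides, current = [], ""
    for aa in seq:
        current += aa
        if aa in "KR":
            peptides.append(current)
            current = ""
    if current:
        peptides.append(current)
    return peptides

def peptide_mass(pep):
    return sum(RESIDUE_MASS[aa] for aa in pep) + WATER

observed = [458.5, 487.6]    # invented peak list (Da)

for name, seq in {"candidateA": "GASPKVTLR", "candidateB": "FFFFK"}.items():
    masses = [peptide_mass(p) for p in tryptic_peptides(seq)]
    hits = sum(any(abs(m - o) < 0.5 for o in observed) for m in masses)
    print(name, [round(m, 1) for m in masses], "matches:", hits)
```
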
TECHNOLOGY: Let there be light
One of the most enabling technologies for biological analysis developed in the 1990s uses one of the most basic sensory inputs—visible light. New fluorescence and phosphorescence techniques were developed for gene and protein sequencing, in vivo and in vitro activity analysis, DNA microarrays, and even the evaluation of transgenic animals and plants. No longer must researchers rely on the radioactive tracers, chemical precipitation, or electrical properties that earlier technologies used. Rather, they hark back to an almost primitive reliance on color, like the biological stains and dyes used in early microscopy. Although computers are used to provide sensitive quantification of color differentials, the eye of the scientist once more keys on colored dots and spots, glowing cells and organelles, and colored traces on recording strips. For the synthetic drug industry, founded in coal-tar dyes before the Pharmaceutical Century began, it might be called a return to colored roots.

Similarly, the development of multidimensional NMR techniques, especially those using more powerful instruments (e.g., 500-MHz NMR), opened the door to solving the structures of proteins and peptides in aqueous environments, as they exist in biological systems. The new NMR techniques allowed observations of the physical flexibility of proteins and the dynamics of their interactions with other molecules—a huge advantage in studies of a protein’s biochemical function, especially for receptors and their target molecules (including potential drugs).

By viewing the computer-generated three-dimensional structure of a protein, made possible by the data gathered from these instruments, researchers could directly observe and study, for the first time, the way a ligand fits into the protein’s active site. The three-dimensional structure provided information about biological function, including the catalysis of reactions and the binding of molecules such as DNA, RNA, and other proteins. In drug design, ligand binding by a target protein was used to induce the ultimate effects of interest, such as cell growth or cell death.

By using the new technologies to study the structure of a disease target protein and learn about its active or ligand-binding sites, rational drug design sought inhibitors or activators that would elicit the desired response. This correlation between structure and biological function (known as the structure–activity relationship, or SAR) became a fundamental underpinning of the revolution in bioinformatics. In the 1990s, the SAR was the basis by which genomics and proteomics were translated into pharmaceutical products.
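
A minimal numeric illustration of an SAR in the classic Hansch style, with invented descriptor values and activities, fits measured activity to physicochemical descriptors by least squares:

```python
# Minimal QSAR sketch: fit log(1/C) as a linear function of two
# descriptors (hydrophobicity and an electronic term) and predict a
# new analogue. All numbers are invented for illustration.
import numpy as np

# rows: analogues; columns: [logP, sigma]
X = np.array([
    [1.2, 0.10],
    [2.0, 0.23],
    [2.9, -0.05],
    [3.4, 0.31],
    [4.1, 0.12],
])
y = np.array([4.1, 4.9, 5.6, 6.2, 6.8])    # observed log(1/C)

A = np.column_stack([X, np.ones(len(X))])  # append an intercept column
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
a, b, c = coeffs
print(f"log(1/C) = {a:.2f}*logP + {b:.2f}*sigma + {c:.2f}")

new = np.array([3.0, 0.15, 1.0])           # hypothetical new analogue
print("predicted log(1/C):", round(float(new @ coeffs), 2))
```

In practice, 1990s SAR models drew on far richer descriptor sets and on the 3D structural data described above, but the core idea of regressing activity on structure-derived quantities is the same.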

The “new” Big Pharma
Ultimately, the biggest change in the pharmaceutical industry, enabled by the progression of technologies throughout the century and culminating in the 1990s, was the aforementioned transformation from a hit-and-miss approach to rational drug discovery in both laboratory design and natural-product surveys.

A new business atmosphere, first seen in the 1980s and institutionalized in the 1990s, revealed itself. It was characterized by mergers and takeovers, and by a dramatic increase in the use of contract research organizations—not only for clinical development, but even for basic R&D. Big Pharma confronted a new business climate and new regulations, born in part from dealing with world market forces and protests by activists in developing countries.

Marketing changed dramatically in the 1990s, partly because of a new consumerism. The Internet made possible the direct purchase of medicines by drug consumers and of raw materials by drug producers, transforming the nature of business. Direct-to-consumer advertising proliferated on radio and TV because of new FDA regulations in 1997 that liberalized requirements for the presentation of risks of medications on electronic media compared with print.

The phenomenal demand for nutritional supplements and so-called alternative medicines created both new opportunities and increased competition in the industry—which led to major scandals in vitamin price-fixing among some of the most respected, or at least some of the biggest, drug corporations. So powerful was the new consumer demand that it represented one of the few times in recent history that the burgeoning power of the FDA was thwarted when the agency attempted to control nutritional supplements as drugs. (The FDA retained the right to regulate them as foods.)

Promise and peril
At the start of the Pharmaceutical Century, the average life expectancy of Americans was 47. At century’s end, the average child born in the United States was projected to live to 76. As the 1900s gave way to the 2000s, biotechnology provided the promise of even more astounding advances in health and longevity. But concomitant with these technological changes was a sea change in the vision of what it was to be a human being.

In the 19th century, the natural world was the source of most medicines. With the dawn of magic bullets in the 20th century, complex organic chemistry opened up a world of drugs created in the laboratory, either modified from nature or synthesized de novo. As the Pharmaceutical Century progressed, from the first knowledge of human hormones to the discovery of the nature of genes and the tools for genetic engineering, the modern paradigm saw a recasting of what human flesh was for—suddenly it was a source of medicines, tissues, and patentable knowledge. Humankind, not the natural world, became the hoped-for premier source of drug discovery. 
Suggested reading
  • Biotechnology: Science, Engineering and Ethical Challenges for the Twenty-First Century, Rudolph, F. B., and McIntire, L. V., Eds. (National Academy Press: Washington, DC, 1996) 
  • A Practical Guide to Combinatorial Chemistry, Czarnik, A. W., and DeWitt, S. H., Eds. (American Chemical Society: Washington, DC, 1997) 
  • Molecular Biology and Biotechnology: A Comprehensive Desk Reference, Meyers, R. A., Ed. (John Wiley & Sons: New York, 1995) 
  • The Organic Chemistry of Drug Design and Drug Action, Silverman, R. (Academic Press: San Diego, 1999) 
  • Using Antibodies: A Laboratory Manual, Harlow, E., and Lane, D. (Cold Spring Harbor Laboratory Press: Cold Spring Harbor, NY, 1999) 

With genes and chemicals suddenly capable of manipulating the warp and woof of the human loom—both mind and body alike—the human pattern seemed to become fluid to design. According to pessimists, even if biotechnology were not abused to create “superhumans,” pharmaceuticals and health care could become the greatest differentiators of human groups in history—not by genetic races, but by economic factors. The new knowledge was found in the highest and most expensive technologies, and often in the hands of those more interested in patents than panaceas.

Yet with this peril of inequality comes the promise of human transformation for good. According to optimists, biotechnology—especially the use of transgenic plants and animals for the production of new drugs and vaccines, xenotransplantation, and the like—promises cheaper, more universal health care. Meanwhile, lifestyle drugs—pharmaceuticals for nonacute conditions such as sterility, impotence, and baldness—also have emerged as a fast-growing category.

From aspirin to Herceptin—a monoclonal antibody that blocks the overexpressed HER2 receptor in breast cancer patients—from herbal medicines to transgenic plants, from horse serum to xenotransplantation, from animal insulin to recombinant human growth hormone, the Pharmaceutical Century was one of transformation. It is too soon to predict what pattern the process will weave, but the human loom has gone high tech. The 21st century will be a brave new tapestry.


© 2000 American Chemical Society